Asymptotics

#econometrics #economics

Oh, Hyunzi. (email: wisdom302@naver.com)
Korea University, Graduate School of Economics.
2024 Spring, instructed by Prof. Dukpa Kim.


Main References

  • Kim, Dukpa. (2024). "Econometric Analysis" (2024 Spring) ECON 518, Department of Economics, Korea University.
  • Davidson and MacKinnon. (2021). "Econometric Theory and Methods", Oxford University Press, New York.

In Inferences in Linear Regression, we assumed that the residuals given the regressors are multivariate normal, i.e. $\varepsilon \mid X \sim N(0, \sigma^2 I_n)$. In practice, however, this assumption is hard to justify, so standard econometrics relies on asymptotic approximations instead. Here, we introduce the asymptotic concepts needed for the later analysis.

Law of Large Numbers

Weak LLN on IID

Theorem (weak law of large numbers).

Let $\{y_i\}_{i=1}^n$ be a sequence of random variables where $y_i \sim \mathrm{IID}(\mu, \sigma^2)$ with $\sigma^2 < \infty$. Then, we have
$$\bar{y}_n := \frac{1}{n}\sum_{i=1}^n y_i \overset{p}{\longrightarrow} \mu.$$

Proof. Note that since $\mathbb{E}[y_i] = \mu$, we have $\mathbb{E}[\bar{y}_n] = \mu$, and from the assumption that $\operatorname{Var}(y_i) = \sigma^2 < \infty$, we have
$$\operatorname{Var}(\bar{y}_n) = \frac{1}{n^2}\operatorname{Var}\Big(\sum_{i=1}^n y_i\Big) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(y_i) = \frac{\sigma^2}{n},$$
where the second equality holds by the independence of the random variables, i.e. $\operatorname{Cov}(y_i, y_j) = 0$ for all $i \neq j$. Then by Chebyshev's inequality, for any $\varepsilon > 0$,
$$P\big(|\bar{y}_n - \mu| \geq \varepsilon\big) \leq \frac{\operatorname{Var}(\bar{y}_n)}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2} \longrightarrow 0,$$
which gives $\bar{y}_n \overset{p}{\to} \mu$ and completes the proof.
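
As a numerical illustration (a simulation sketch, not part of the original notes; the exponential design is an arbitrary choice), the sample mean's deviation from $\mu$ shrinks at the $\sigma/\sqrt{n}$ rate implied by $\operatorname{Var}(\bar{y}_n) = \sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0  # exponential(1) has mean 1 and variance 1

for n in [10, 100, 1_000, 10_000, 100_000]:
    y_bar = rng.exponential(scale=mu, size=n).mean()  # sample mean of n IID draws
    # deviation from mu versus the sd of y_bar, sigma / sqrt(n)
    print(f"n={n:>6}: |y_bar - mu| = {abs(y_bar - mu):.5f}, sigma/sqrt(n) = {sigma / np.sqrt(n):.5f}")
```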

Remark (weak LLN and smlim, plim).

Define $\bar{y}_n := \frac{1}{n}\sum_{i=1}^n y_i$. Then under the assumptions of Theorem 1 (weak law of large numbers), we have
$$\lim_{n\to\infty} \mathbb{E}\big[(\bar{y}_n - \mu)^2\big] = 0,$$
i.e. $\operatorname{smlim}_{n\to\infty} \bar{y}_n = \mu$; thus the weak LLN can alternatively be derived from mean-square convergence, since $\operatorname{smlim} \bar{y}_n = \mu$ implies $\operatorname{plim} \bar{y}_n = \mu$.

Proof. By the assumption that $\mathbb{E}[y_i] = \mu$ and $\operatorname{Var}(y_i) = \sigma^2$, we have $\mathbb{E}[\bar{y}_n] = \mu$, and the mean-squared deviation is
$$\mathbb{E}\big[(\bar{y}_n - \mu)^2\big] = \mathbb{E}\Big[\Big(\frac{1}{n}\sum_{i=1}^n (y_i - \mu)\Big)^2\Big] = \frac{1}{n^2}\sum_{i=1}^n \mathbb{E}\big[(y_i - \mu)^2\big] = \frac{\sigma^2}{n},$$
where the second equality holds since the cross terms vanish by the independence between $y_i$ and $y_j$ for all $i \neq j$:
$$\mathbb{E}\big[(y_i - \mu)(y_j - \mu)\big] = \mathbb{E}[y_i - \mu]\,\mathbb{E}[y_j - \mu] = 0.$$
Therefore, we have $\mathbb{E}[(\bar{y}_n - \mu)^2] = \sigma^2/n \to 0$, i.e. $\bar{y}_n$ converges to $\mu$ in mean square. The rest of the proof follows Theorem 1 (weak law of large numbers).

Strong LLN on IID

Remark (bounded expectation and probability of finiteness).

Let $x$ be a random variable. If the probability that $x$ has a finite value is less than $1$, then the expected value of $|x|$ is infinite:
$$P(|x| < \infty) < 1 \implies \mathbb{E}[|x|] = \infty.$$
Equivalently (by contraposition), if the expected value of $|x|$ is finite, then $x$ must have a finite value with probability $1$:
$$\mathbb{E}[|x|] < \infty \implies P(|x| < \infty) = 1.$$

Theorem (strong law of large numbers).

Let $\{y_i\}_{i=1}^n$ be a sequence of random variables where $y_i \sim \mathrm{IID}(\mu, \sigma^2)$ with $\mathbb{E}\big[(y_i - \mu)^4\big] < \infty$. Then, we have
$$\bar{y}_n = \frac{1}{n}\sum_{i=1}^n y_i \overset{a.s.}{\longrightarrow} \mu.$$

Proof. From Convergence of Random Variables > Definition 7 (almost surely converges), $\bar{y}_n \overset{a.s.}{\to} \mu$ if
$$P\Big(\lim_{n\to\infty} \bar{y}_n = \mu\Big) = 1,$$
i.e. $\bar{y}_n \to \mu$ along almost every sample path. Similar to the proof of Remark 2 (weak LLN and smlim, plim), define
$$z_n := \bar{y}_n - \mu = \frac{1}{n}\sum_{i=1}^n (y_i - \mu);$$
then, we have $\bar{y}_n \overset{a.s.}{\to} \mu$ if and only if $z_n \overset{a.s.}{\to} 0$. Note that if $\sum_{n=1}^\infty z_n^4 < \infty$ along a sample path, then $z_n^4 \to 0$, and hence $z_n \to 0$, along that path. Also, using Remark 3 (bounded expectation and probability of finiteness), we have
$$\mathbb{E}\Big[\sum_{n=1}^\infty z_n^4\Big] < \infty \implies P\Big(\sum_{n=1}^\infty z_n^4 < \infty\Big) = 1 \implies P\Big(\lim_{n\to\infty} z_n = 0\Big) = 1.$$
Therefore, it is sufficient to show that $\mathbb{E}\big[\sum_{n=1}^\infty z_n^4\big]$ is finite.

Note that
$$\mathbb{E}\Big[\sum_{n=1}^\infty z_n^4\Big] = \sum_{n=1}^\infty \mathbb{E}\big[z_n^4\big]$$
by the monotone convergence theorem. Since $(y_i - \mu)$ is independent across $i$ with mean zero, every term in the expansion of $\mathbb{E}\big[\big(\sum_{i=1}^n (y_i - \mu)\big)^4\big]$ in which some index appears exactly once vanishes; the surviving terms are the $n$ fourth powers and the $3n(n-1)$ paired squares. Denote $K := \mathbb{E}[(y_i - \mu)^4]$ and $\sigma^2 := \operatorname{Var}(y_i)$. Then, we have
$$\mathbb{E}\big[z_n^4\big] = \frac{1}{n^4}\Big(nK + 3n(n-1)\sigma^4\Big) \leq \frac{K + 3\sigma^4}{n^2}.$$
Now, we have
$$\sum_{n=1}^\infty \mathbb{E}\big[z_n^4\big] \leq (K + 3\sigma^4)\sum_{n=1}^\infty \frac{1}{n^2} < \infty.$$
Therefore, we have $\mathbb{E}\big[\sum_{n=1}^\infty z_n^4\big] < \infty$, which completes the proof.
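
The counting step above ($n$ fourth-power terms plus $3n(n-1)$ paired-square terms) can be sanity-checked by simulation. A minimal sketch, assuming Uniform$(-1,1)$ draws so that $K = 1/5$ and $\sigma^2 = 1/3$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 200_000

# IID draws with mean 0: (y_i - mu) ~ Uniform(-1, 1)
x = rng.uniform(-1.0, 1.0, size=(reps, n))
K = 1.0 / 5.0       # E[(y - mu)^4] for Uniform(-1, 1)
sigma2 = 1.0 / 3.0  # Var(y) for Uniform(-1, 1)

z4_mc = (x.mean(axis=1) ** 4).mean()                        # Monte Carlo E[z_n^4]
z4_formula = (n * K + 3 * n * (n - 1) * sigma2**2) / n**4   # (nK + 3n(n-1)sigma^4) / n^4
print(z4_mc, z4_formula)  # the two numbers should be close
```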

Remark (finite second moment is not sufficient for strong LLN).

Assuming only $\mathbb{E}[(y_i - \mu)^2] = \sigma^2 < \infty$ and attempting the same argument with second moments, we still have $\mathbb{E}[z_n^2] = \sigma^2/n$, but
$$\sum_{n=1}^\infty \mathbb{E}\big[z_n^2\big] = \sum_{n=1}^\infty \frac{\sigma^2}{n} = \infty,$$
since the harmonic series diverges. Thus a finite second moment is not sufficient for this line of proof, and the finite fourth moment is required. (The strong LLN itself holds under weaker conditions; see Kolmogorov's Theorem below.)

Other Versions of LLN

Theorem (Kolmogorov's Theorem).

Let $\{y_i\}$ be a sequence of IID random variables. Then
$$\bar{y}_n \overset{a.s.}{\longrightarrow} \mu$$
if and only if $\mathbb{E}[|y_i|] < \infty$ and $\mathbb{E}[y_i] = \mu$.

Here, the LLN rests on three types of regularity conditions:

  1. independence across $i$,
  2. no heterogeneity (identical distributions),
  3. finite first moment.

When condition 3 fails, the sample mean need not settle down at all, as the simulation sketch below illustrates; McLeish's theorem next relaxes conditions 1 and 2.
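
A simulation sketch of why condition 3 matters (the standard Cauchy, which has no finite mean, is the usual counterexample; this is illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)

# standard Cauchy draws: E|y| = infinity, so Kolmogorov's condition fails
y = rng.standard_cauchy(size=1_000_000)
for n in [100, 10_000, 1_000_000]:
    # the running sample mean keeps jumping instead of converging
    print(f"n={n:>9}: sample mean = {y[:n].mean():8.3f}")
```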
Theorem (McLeish's Theorem).

Let $\{y_i\}$ be a sequence of random variables with finite means $\mu_i := \mathbb{E}[y_i]$, and suppose $\{y_i\}$ is mixing with $\phi_m$ of size $r/(2r-1)$, $r \geq 1$, or $\alpha_m$ of size $r/(r-1)$, $r > 1$. If $\mathbb{E}\big[|y_i - \mu_i|^{r+\delta}\big]$ is bounded uniformly in $i$ for some $\delta > 0$, then we have
$$\frac{1}{n}\sum_{i=1}^n (y_i - \mu_i) \overset{a.s.}{\longrightarrow} 0.$$
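
As an illustration of an LLN under dependence (a sketch only; the Gaussian AR(1) design is an assumption and is not calibrated to the exact mixing-rate conditions above), the sample mean of a serially dependent but mixing process still converges:

```python
import numpy as np

rng = np.random.default_rng(3)
n, rho = 1_000_000, 0.8

# Gaussian AR(1): y_t = rho * y_{t-1} + e_t, stationary with mean 0, serially dependent
e = rng.standard_normal(n)
y = np.empty(n)
y[0] = e[0] / np.sqrt(1 - rho**2)  # draw y_0 from the stationary distribution
for t in range(1, n):
    y[t] = rho * y[t - 1] + e[t]

for m in [1_000, 100_000, 1_000_000]:
    print(f"n={m:>9}: sample mean = {y[:m].mean():.5f}")  # approaches 0
```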

Central Limit Theorem

Classical CLT

Before proving Theorem 8 (Lindeberg-Levy CLT), recall the following definition.

Definition 9 (moment-generating function).

The moment-generating function (MGF) of a random variable $x$ is
$$M_x(t) := \mathbb{E}\big[e^{tx}\big];$$
if $\mathbf{x}$ is a random vector, then the moment-generating function is
$$M_{\mathbf{x}}(\mathbf{t}) := \mathbb{E}\big[e^{\mathbf{t}'\mathbf{x}}\big].$$
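
A quick numerical check of the definition (a sketch, not from the notes), using the standard normal, whose MGF $e^{t^2/2}$ is also cited in the proof below:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(2_000_000)

for t in [0.0, 0.5, 1.0]:
    mgf_mc = np.exp(t * x).mean()  # sample analogue of E[exp(t x)]
    mgf_exact = np.exp(t**2 / 2)   # MGF of N(0, 1): exp(t^2 / 2)
    print(f"t={t}: MC = {mgf_mc:.4f}, exact = {mgf_exact:.4f}")
```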

Finally, we formally prove the Central Limit Theorem (CLT).

Theorem (Lindeberg-Levy CLT).

Let $\{y_i\}_{i=1}^n$ be a sequence of random variables where $y_i \sim \mathrm{IID}(\mu, \sigma^2)$, and assume that the moment-generating function for each $y_i$ exists:
$$M_{y_i}(t) < \infty \quad \text{for all } t \text{ in a neighborhood of } 0.$$
Then, we have
$$b_n := \frac{1}{\sqrt{n}}\sum_{i=1}^n z_i \overset{d}{\longrightarrow} N(0, 1),$$
where $z_i := (y_i - \mu)/\sigma$ is the standardized quantity and $b_n = \sqrt{n}(\bar{y}_n - \mu)/\sigma$.

Proof. Define
$$z_i := \frac{y_i - \mu}{\sigma} \quad \text{and} \quad b_n := \frac{1}{\sqrt{n}}\sum_{i=1}^n z_i.$$
Since $z_i$ follows $\mathrm{IID}(0, 1)$, where $\mathbb{E}[z_i] = 0$ and $\operatorname{Var}(z_i) = 1$, we have $M_z(0) = 1$, $M_z'(0) = 0$, and $M_z''(0) = 1$. Note that the second-order expansion of the MGF of $z_i$ around $t = 0$ is
$$M_z(t) = 1 + \frac{t^2}{2} + o(t^2),$$
and by Statistical Proof > Proposition 13 (linear combination of MGF) and Statistical Proof > Proposition 12 (linear transformation of MGF), we have
$$M_{b_n}(t) = \prod_{i=1}^n M_{z_i}\Big(\frac{t}{\sqrt{n}}\Big) = \Big[M_z\Big(\frac{t}{\sqrt{n}}\Big)\Big]^n,$$
where the last equality holds since the MGF of $z_i$ is identical for all $i$.

Thus, we have
$$M_{b_n}(t) = \Big[1 + \frac{t^2}{2n} + o\Big(\frac{t^2}{n}\Big)\Big]^n \longrightarrow e^{t^2/2},$$
where the last term is the MGF of the standard normal distribution: by Statistical Proof > Lemma 23 (MGF of multivariate normal distribution), if $x \sim N(0, 1)$, then $M_x(t) = e^{t^2/2}$. Therefore, we have $b_n \overset{d}{\to} N(0, 1)$, which completes the proof.

Note that we can alternatively use Statistical Proof > Definition 14 (characteristic function) to prove Theorem 8 (Lindeberg-Levy CLT), since the MGF may not exist for some random variables (while the CF always exists for any random variable).
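
A simulation sketch of the theorem (the skewed exponential design is an arbitrary assumption): the standardized quantity $b_n$ behaves like a standard normal draw for moderate $n$:

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 500, 20_000
mu, sigma = 1.0, 1.0  # exponential(1): mean 1, sd 1, heavily skewed

y = rng.exponential(scale=1.0, size=(reps, n))
b = np.sqrt(n) * (y.mean(axis=1) - mu) / sigma  # b_n = sqrt(n)(ybar - mu)/sigma

print("mean, var:", b.mean().round(3), b.var().round(3))          # ~ 0 and ~ 1
print("P(b <= 1.645):", (b <= 1.645).mean().round(3), "vs 0.95")  # ~ N(0,1) CDF
```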

A more compact version of Theorem 8 (Lindeberg-Levy CLT) is:

Theorem (Lindeberg-Levy CLT2).

Let $\{y_i\}$ be a sequence of IID random variables with $\mathbb{E}[y_i] = \mu$ and $\operatorname{Var}(y_i) = \sigma^2$. Then, if $\sigma^2 < \infty$, we have
$$\sqrt{n}\,(\bar{y}_n - \mu) \overset{d}{\longrightarrow} N(0, \sigma^2),$$
where $\bar{y}_n = \frac{1}{n}\sum_{i=1}^n y_i$.

However, the identical-distribution assumption in Theorem 8 (Lindeberg-Levy CLT) can be relaxed to some extent:

Theorem (Lyapunov CLT).

Let $\{y_i\}$ be a sequence of independent random variables where $\mathbb{E}[y_i] = \mu_i$ and $\operatorname{Var}(y_i) = \sigma_i^2 < \infty$; now define
$$s_n^2 := \sum_{i=1}^n \sigma_i^2.$$
If for some $\delta > 0$, the Lyapunov condition
$$\lim_{n\to\infty} \frac{1}{s_n^{2+\delta}}\sum_{i=1}^n \mathbb{E}\big[|y_i - \mu_i|^{2+\delta}\big] = 0$$
is satisfied, then we have
$$\frac{1}{s_n}\sum_{i=1}^n (y_i - \mu_i) \overset{d}{\longrightarrow} N(0, 1),$$
i.e. the standardized sum of $y_i$ converges in distribution to a standard normal distribution as $n$ goes to infinity.
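
A sketch with independent but heterogeneous summands, standardized by $s_n$ as in the theorem (the bounded uniform design, which satisfies the Lyapunov condition, is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 1_000, 10_000

# independent, NOT identically distributed: y_i ~ Uniform(-c_i, c_i), Var = c_i^2 / 3
c = 1.0 + np.arange(n) % 5              # scales cycle through 1..5
y = rng.uniform(-c, c, size=(reps, n))  # means are all 0
s_n = np.sqrt(np.sum(c**2 / 3.0))       # s_n^2 = sum of the variances
stat = y.sum(axis=1) / s_n              # (1/s_n) * sum_i (y_i - mu_i)

print("mean, var:", stat.mean().round(3), stat.var().round(3))          # ~ 0 and ~ 1
print("P(stat <= 1.96):", (stat <= 1.96).mean().round(3), "vs 0.975")
```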

CLT of Student-T Distribution

Corollary (CLT of student-t distribution).

Let the $t$-value be defined as
$$t_n := \frac{\bar{y}_n - \mu}{s/\sqrt{n}}, \quad \text{where} \quad s^2 := \frac{1}{n-1}\sum_{i=1}^n (y_i - \bar{y}_n)^2.$$
Then we have $t_n \overset{d}{\to} N(0, 1)$, since $s \overset{p}{\to} \sigma$ and Slutsky's theorem applies to $t_n = \big(\sqrt{n}(\bar{y}_n - \mu)/\sigma\big)\big/(s/\sigma)$.
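
A sketch of the corollary (the chi-square design is an illustrative assumption): $t$-values computed from non-normal IID data are approximately $N(0,1)$ for large $n$:

```python
import numpy as np

rng = np.random.default_rng(7)
n, reps, mu = 200, 50_000, 3.0  # chi-square(3) has mean 3

y = rng.chisquare(df=3, size=(reps, n))
ybar = y.mean(axis=1)
s = y.std(axis=1, ddof=1)           # s^2 with the 1/(n-1) convention
t = (ybar - mu) / (s / np.sqrt(n))  # the t-value from the corollary

print("mean, var:", t.mean().round(3), t.var().round(3))            # ~ 0 and ~ 1
print("P(|t| > 1.96):", (np.abs(t) > 1.96).mean().round(3), "vs 0.05")
```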

CLT for Joint Distributions

Theorem (Cramer-Wold device).

Let $\{\mathbf{x}_n\}$ be a sequence of random vectors, and suppose that for any real vector $\boldsymbol{\lambda}$ such that $\boldsymbol{\lambda}'\boldsymbol{\lambda} = 1$,
$$\boldsymbol{\lambda}'\mathbf{x}_n \overset{d}{\longrightarrow} \boldsymbol{\lambda}'\mathbf{x},$$
where $\mathbf{x}$ is a vector with joint distribution function $F_{\mathbf{x}}$. Then, the limiting distribution of $\mathbf{x}_n$ exists and equals $F_{\mathbf{x}}$.

Remark (CLT of random vector).

Let $\{\mathbf{y}_i\}$ be a sequence of $k \times 1$ random vectors, generated from $\mathrm{IID}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$. Now consider $\bar{\mathbf{y}}_n = \frac{1}{n}\sum_{i=1}^n \mathbf{y}_i$; then we have $\mathbb{E}[\bar{\mathbf{y}}_n] = \boldsymbol{\mu}$ and $\operatorname{Var}(\bar{\mathbf{y}}_n) = \boldsymbol{\Sigma}/n$. Thus, for a vector $\boldsymbol{\lambda}$ such that $\boldsymbol{\lambda}'\boldsymbol{\lambda} = 1$, $\{\boldsymbol{\lambda}'\mathbf{y}_i\}$ is a sequence of IID random variables such that
$$\mathbb{E}[\boldsymbol{\lambda}'\mathbf{y}_i] = \boldsymbol{\lambda}'\boldsymbol{\mu}, \qquad \operatorname{Var}(\boldsymbol{\lambda}'\mathbf{y}_i) = \boldsymbol{\lambda}'\boldsymbol{\Sigma}\boldsymbol{\lambda}.$$
Using Theorem 9 (Lindeberg-Levy CLT2), we have
$$\sqrt{n}\,\big(\boldsymbol{\lambda}'\bar{\mathbf{y}}_n - \boldsymbol{\lambda}'\boldsymbol{\mu}\big) \overset{d}{\longrightarrow} N(0, \boldsymbol{\lambda}'\boldsymbol{\Sigma}\boldsymbol{\lambda}).$$
Therefore, by Theorem 12 (Cramer-Wold device), we have
$$\sqrt{n}\,(\bar{\mathbf{y}}_n - \boldsymbol{\mu}) \overset{d}{\longrightarrow} N(\mathbf{0}, \boldsymbol{\Sigma}), \quad \text{or} \quad \frac{1}{\sqrt{n}}\sum_{i=1}^n (\mathbf{y}_i - \boldsymbol{\mu}) \overset{d}{\longrightarrow} N(\mathbf{0}, \boldsymbol{\Sigma}).$$
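
A sketch of the multivariate statement (the bivariate exponential design is an illustrative assumption): across replications, the covariance of $\sqrt{n}(\bar{\mathbf{y}}_n - \boldsymbol{\mu})$ should approach $\boldsymbol{\Sigma}$:

```python
import numpy as np

rng = np.random.default_rng(8)
n, reps = 400, 10_000
mu = np.array([1.0, 2.0])    # exponential(scale) has mean = scale
Sigma = np.diag([1.0, 4.0])  # and variance = scale^2 (components independent)

y = rng.exponential(scale=mu, size=(reps, n, 2))
z = np.sqrt(n) * (y.mean(axis=1) - mu)   # sqrt(n)(ybar - mu), one row per replication

print(np.cov(z, rowvar=False).round(2))  # sample covariance, should be close to Sigma
```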

Delta Method

Proposition (delta method).

Let $\{x_n\}$ be a sequence of random variables satisfying
$$\sqrt{n}\,(x_n - \theta) \overset{d}{\longrightarrow} N(0, \sigma^2),$$
where $\theta$ is a finite-valued constant. Then we have
$$\sqrt{n}\,\big(g(x_n) - g(\theta)\big) \overset{d}{\longrightarrow} N\big(0, [g'(\theta)]^2\,\sigma^2\big)$$
for any function $g$ such that $g'(\theta)$ exists and is non-zero.
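
A sketch of the delta method with $g(x) = x^2$ (an illustrative choice), so that the limiting variance is $[g'(\theta)]^2\sigma^2 = 4\theta^2\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(9)
n, reps = 500, 20_000
theta, sigma = 2.0, 1.0

# x_n = mean of n IID N(theta, sigma^2) draws, so sqrt(n)(x_n - theta) -> N(0, sigma^2)
x_n = rng.normal(theta, sigma, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (x_n**2 - theta**2)  # g(x) = x^2, so g'(theta) = 2 * theta

print("simulated var:", z.var().round(2), "vs (2*theta*sigma)^2 =", (2 * theta * sigma) ** 2)
```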

Big O and small o

Definition (O and o).

For (non-random) sequences $a_n$ and $b_n > 0$, the small o denotes
$$a_n = o(b_n) \quad \text{when} \quad \lim_{n\to\infty} \frac{a_n}{b_n} = 0,$$
and the big O denotes
$$a_n = O(b_n) \quad \text{when} \quad \Big|\frac{a_n}{b_n}\Big| \leq M < \infty \text{ for all sufficiently large } n,$$
where the limit of $a_n/b_n$, if it exists, can be zero.

Example (small o and big O).

Let $a_n = 1/n$. Then we have $a_n = o(n^{-1/2})$ since
$$\frac{a_n}{n^{-1/2}} = \frac{1}{\sqrt{n}} \longrightarrow 0,$$
and we say $a_n = O(n^{-1})$ since
$$\Big|\frac{a_n}{n^{-1}}\Big| = 1 < \infty \quad \text{for all } n.$$

Proposition (small o implies big O).

If $a_n = o(b_n)$, then we have $a_n = O(b_n)$. However, the converse does not always hold.

Proof. Assume $a_n = o(b_n)$. Then we have $a_n/b_n \to 0$, so $|a_n/b_n|$ is bounded for all sufficiently large $n$; thus $a_n = O(b_n)$. For the converse, take $a_n = b_n = 1$: then $a_n = O(b_n)$ but $a_n/b_n = 1 \not\to 0$, so $a_n \neq o(b_n)$.

Definition (Op and op).

For the sequence of random variables $x_n$, we say $x_n = o_p(a_n)$ if
$$\frac{x_n}{a_n} \overset{p}{\longrightarrow} 0, \quad \text{i.e.} \quad \lim_{n\to\infty} P\Big(\Big|\frac{x_n}{a_n}\Big| \geq \varepsilon\Big) = 0 \text{ for every } \varepsilon > 0,$$
and we say $x_n = O_p(a_n)$ if for every $\varepsilon > 0$ there exist $M_\varepsilon < \infty$ and $N_\varepsilon$ such that
$$P\Big(\Big|\frac{x_n}{a_n}\Big| > M_\varepsilon\Big) < \varepsilon \quad \text{for all } n > N_\varepsilon.$$

Example (big O and small o in probability).

We have $x_n = O_p(1)$ if $x_n$ converges in distribution to some random variable $x$, and $x_n = o_p(1)$ if $x_n \overset{p}{\to} 0$. For instance, under the assumptions of Theorem 9 (Lindeberg-Levy CLT2), $\sqrt{n}(\bar{y}_n - \mu) = O_p(1)$ and hence $\bar{y}_n - \mu = O_p(n^{-1/2})$, while the weak LLN says $\bar{y}_n - \mu = o_p(1)$.
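
A sketch of the two rates (normal draws are an arbitrary assumption): across replications, $\bar{y}_n - \mu$ shrinks, i.e. is $o_p(1)$, while $\sqrt{n}(\bar{y}_n - \mu)$ stays stochastically bounded, i.e. is $O_p(1)$:

```python
import numpy as np

rng = np.random.default_rng(10)
reps = 1_000

for n in [100, 10_000]:
    dev = rng.standard_normal((reps, n)).mean(axis=1)  # ybar - mu, with mu = 0
    # o_p(1): the raw deviation shrinks with n; O_p(1): the sqrt(n)-scaled one stays put
    print(f"n={n:>6}: sd(ybar - mu) = {dev.std():.4f}, "
          f"sd(sqrt(n)(ybar - mu)) = {(np.sqrt(n) * dev).std():.4f}")
```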

Proposition (properties of op and Op).

  • $o_p(1) + o_p(1) = o_p(1)$, and $O_p(1) + O_p(1) = O_p(1)$.
  • $O_p(1) + o_p(1) = O_p(1)$.
  • $O_p(1)\,o_p(1) = o_p(1)$, and $o_p(1)\,o_p(1) = o_p(1)$.
  • $o_p(a_n) = a_n\,o_p(1)$ and $O_p(a_n) = a_n\,O_p(1)$.